The goal is then to select the class with maximum a posteriori probability, by considering the ratio
$$ f(m \mid n) \;=\; \frac{l(n \mid m)\, f(m)}{\sum_{m'} l(n \mid m')\, f(m')} \qquad (2) $$
The equation for f ( m | n ) has the form of Bayes' formula for a posteriori probabilities: if the l ( n | m ) obtained as the result of learning are conditional likelihoods, then f ( m | n ) are the Bayesian posterior probabilities that signal n originated from object m.
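A minimal sketch of this MAP rule, assuming the learned conditional likelihoods l ( n | m ) and prior probabilities f ( m ) are available as arrays (the function name and the numbers are hypothetical, for illustration only):

```python
import numpy as np

def map_classify(likelihoods, priors):
    """Select the class with maximum a posteriori probability.

    likelihoods[m] = l(n|m) for the observed signal n; priors[m] = f(m).
    Returns the MAP class index and the full posterior f(m|n).
    """
    joint = likelihoods * priors        # l(n|m) f(m) for each class m
    posterior = joint / joint.sum()     # Bayes' formula: normalize over classes
    return int(np.argmax(posterior)), posterior

# Illustrative example: three classes, uniform priors
l = np.array([0.2, 0.5, 0.1])          # l(n|m) for m = 0, 1, 2
f = np.array([1/3, 1/3, 1/3])          # f(m)
m_hat, post = map_classify(l, f)
```

Because the normalizing denominator is the same for every class, the argmax could equally be taken over the unnormalized products l ( n | m ) f ( m ); the division is kept here only to show the full posterior.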